Kullback-Leibler divergence for interacting multiple model estimation with random matrices
Authors
Abstract
This paper studies the problem of interacting multiple model (IMM) estimation for jump Markov linear systems with unknown measurement noise covariance. The system state and the unknown covariance are jointly estimated in the framework of Bayesian estimation, where the unknown covariance is modeled as a random matrix according to an inverse-Wishart distribution. For IMM estimation with random matrices, one difficulty encountered is the combination of a set of weighted inverse-Wishart distributions. Instead of using the moment matching approach, this difficulty is overcome by minimizing the weighted Kullback-Leibler divergence for inverse-Wishart distributions. It is shown that a closed-form solution can be derived for the optimization problem and that the resulting solution is itself an inverse-Wishart distribution. Simulation results show that the proposed filter performs better than the previous work using the moment matching approach.

Index Terms: Interacting multiple model, Kullback-Leibler divergence, Random matrix, Jump Markov system
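The combination step described in the abstract can be sketched concretely. For an exponential-family density such as the inverse-Wishart, minimizing a weighted sum of KL divergences over the family reduces to averaging the natural parameters, which for normalized weights amounts to averaging the degrees of freedom and the scale matrices. The sketch below illustrates that reduction; the function name, parameterization, and the exponential-family argument are this illustration's assumptions, not the paper's notation.

```python
import numpy as np

def combine_inverse_wishart(weights, dofs, scales):
    """Combine weighted inverse-Wishart densities IW(nu_i, V_i) into a single
    IW(nu_bar, V_bar) by minimizing the weighted KL divergence
    sum_i w_i * KL(q || p_i) over q in the inverse-Wishart family.

    Since the inverse-Wishart is an exponential family, the minimizer is the
    member whose natural parameters are the weighted average of the inputs';
    with weights summing to one this averages the degrees of freedom and the
    scale matrices directly.
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()             # normalize mixture weights
    nu_bar = float(np.dot(weights, dofs))         # averaged degrees of freedom
    V_bar = sum(w * V for w, V in zip(weights, scales))  # averaged scale matrix
    return nu_bar, V_bar
```

For example, combining two equally weighted components with degrees of freedom 6 and 10 and scale matrices 2I and 4I yields degrees of freedom 8 and scale matrix 3I.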
Similar articles
Model Confidence Set Based on Kullback-Leibler Divergence Distance
Consider the problem of estimating a true density h(.) based upon a random sample X1,…, Xn. In general, h(.) is approximated using an appropriate (in some sense; see below) model f_θ(x). This article, using Vuong's (1989) test along with a collection of k (> 2) non-nested models, constructs a set of appropriate models, called a model confidence set, for the unknown model h(.). Application of such confide...
Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil
In this paper we examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in a Dynamic Linear Model, using the real price of oil over 106 years of data from 1913 to 2018, concerning the asymmetry problem in filtering and forecasting. We use the DLM form of the basic Hotelling Model under the quadratic, Kullback-Leibler, Hellinger and LINEX loss functions, trying to address the ...
Kullback-Leibler divergence and Markov random fields for speckled image restoration
In this paper we describe an approximation of speckled image observation (attachment-to-data) laws by generalized Gaussian PDFs. We use the Kullback-Leibler (KL) divergence (entropy) for this purpose. This leads to a mathematical model which can be useful for speckled image restoration and for related hyperparameter estimation.
Entropy and Kullback-Leibler divergence estimation based on Szegö's theorem
In this work, a new technique for the estimation of the Shannon entropy and the Kullback-Leibler (KL) divergence for one-dimensional data is presented. The estimator is based on Szegö's theorem for sequences of Toeplitz matrices, which deals with the asymptotic behavior of the eigenvalues of those matrices, and on the analogy between a probability density function (PDF) and a power spectral ...
Lower bounds for volatility estimation in microstructure noise models
In this paper minimax lower bounds are derived for the estimation of the instantaneous volatility in three related high-frequency statistical models. These bounds are based on new upper bounds for the Kullback-Leibler divergence between two multivariate normal random variables along with a spectral analysis of the processes. A comparison with known upper bounds shows that these lower bounds are...
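The KL divergence between two multivariate normal distributions, on which bounds like those above rely, has a well-known closed form. A minimal sketch (the function name is this illustration's own):

```python
import numpy as np

def kl_mvn(mu0, S0, mu1, S1):
    """Closed-form KL divergence KL(N(mu0, S0) || N(mu1, S1)) between two
    multivariate normal distributions:
    0.5 * [tr(S1^-1 S0) + (mu1-mu0)^T S1^-1 (mu1-mu0) - d + ln(det S1 / det S0)]
    """
    mu0, mu1 = np.asarray(mu0, float), np.asarray(mu1, float)
    d = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    term_trace = np.trace(S1_inv @ S0)          # tr(S1^-1 S0)
    term_quad = diff @ S1_inv @ diff            # Mahalanobis-type mean term
    term_logdet = np.log(np.linalg.det(S1) / np.linalg.det(S0))
    return 0.5 * (term_trace + term_quad - d + term_logdet)
```

The divergence is zero for identical distributions and grows as either the means or the covariances separate, which is what makes it usable for the spectral comparisons described above.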
Journal: CoRR
Volume: abs/1411.1284
Pages: -
Publication year: 2014